Improving neural networks generalization with new constructive and pruning methods

Authors

  • Marcelo Azevedo Costa
  • Antônio de Pádua Braga
  • Benjamin Rodrigues de Menezes
Abstract

This paper presents a new constructive method and pruning approaches to control the design of Multi-Layer Perceptron (MLP) networks without loss of performance. The proposed methods use a multi-objective approach to guarantee generalization. The constructive approach searches for an optimal solution according to the shape of the Pareto set as the number of hidden nodes increases. The pruning methods are able to simplify the network topology and to identify linear connections between the inputs and outputs of the neural model. Topology information and validation sets are used.
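To make the multi-objective selection idea concrete, the sketch below grows candidate MLPs with an increasing number of hidden nodes, scores each candidate on two objectives (training error and the squared norm of the weights), extracts the Pareto-optimal candidates, and lets a validation set pick the final network. This is a minimal illustration of the selection principle only, assuming scikit-learn's MLPRegressor as a stand-in trainer; it is not the authors' algorithm, which shapes the Pareto set through its own multi-objective training.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def pareto_front(scores):
    """Indices of (error, norm) pairs not dominated by any other pair."""
    keep = []
    for i, (e_i, n_i) in enumerate(scores):
        dominated = any(e_j <= e_i and n_j <= n_i and (e_j < e_i or n_j < n_i)
                        for j, (e_j, n_j) in enumerate(scores) if j != i)
        if not dominated:
            keep.append(i)
    return keep

def select_mlp(X_tr, y_tr, X_val, y_val, max_hidden=15):
    nets, scores = [], []
    for h in range(1, max_hidden + 1):
        net = MLPRegressor(hidden_layer_sizes=(h,), max_iter=2000,
                           random_state=0).fit(X_tr, y_tr)
        train_err = np.mean((net.predict(X_tr) - y_tr) ** 2)    # objective 1: fit
        weight_norm = sum(np.sum(W ** 2) for W in net.coefs_)   # objective 2: complexity
        nets.append(net)
        scores.append((train_err, weight_norm))
    # among the Pareto-optimal candidates, keep the best validation performer
    best = min(pareto_front(scores),
               key=lambda i: np.mean((nets[i].predict(X_val) - y_val) ** 2))
    return nets[best]
```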


Similar articles

A Comparative Study of Neural Network Optimization Techniques

In recent years we developed ENZO, an evolutionary neural network optimizer that surpasses other algorithms with regard to performance and scalability. In this study we compare ENZO to standard techniques for topology optimization: Optimal Brain Surgeon (OBS), Magnitude based Pruning (MbP), and an improved algorithm derived from OBS (unit-OBS). Furthermore, we compare results to a newly pr...
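Magnitude-based pruning (MbP), the simplest of the techniques compared above, can be sketched in a few lines: the fraction of weights with the smallest absolute values is set to zero. This is a generic illustration of MbP under assumed NumPy weight matrices, not the ENZO, OBS, or unit-OBS procedures.

```python
import numpy as np

def magnitude_prune(weight_matrices, fraction=0.2):
    """Zero out the `fraction` of weights with the smallest absolute value."""
    flat = np.concatenate([w.ravel() for w in weight_matrices])
    threshold = np.quantile(np.abs(flat), fraction)
    return [np.where(np.abs(w) < threshold, 0.0, w) for w in weight_matrices]
```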


Pruning Strategies for the MTiling Constructive Learning Algorithm

We present a framework for incorporating pruning strategies in the MTiling constructive neural network learning algorithm. Pruning involves the elimination of redundant elements (connection weights or neurons) from a network and is of considerable practical interest. We describe three elementary sensitivity-based strategies for pruning neurons. Experimental results demonstrate a moderate to signifi...
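The sensitivity-based idea can be illustrated generically: each hidden neuron is disabled in turn, the resulting increase in training error is taken as that neuron's sensitivity, and the least sensitive neurons become pruning candidates. The single-hidden-layer forward pass below is written only for this example and is not the MTiling algorithm itself.

```python
import numpy as np

def forward(X, W1, b1, W2, b2, mask=None):
    """Single-hidden-layer MLP pass; `mask` zeroes out selected hidden neurons."""
    H = np.tanh(X @ W1 + b1)
    if mask is not None:
        H = H * mask
    return H @ W2 + b2

def neuron_sensitivities(X, y, W1, b1, W2, b2):
    """Error increase caused by removing each hidden neuron in isolation."""
    base = np.mean((forward(X, W1, b1, W2, b2) - y) ** 2)
    sens = []
    for j in range(W1.shape[1]):
        mask = np.ones(W1.shape[1])
        mask[j] = 0.0                          # temporarily disable neuron j
        err = np.mean((forward(X, W1, b1, W2, b2, mask) - y) ** 2)
        sens.append(err - base)                # small increase -> pruning candidate
    return np.array(sens)
```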


Several Aspects of Pruning Methods in Recursive Least Square Algorithms for Neural Networks

Recently, recursive least squares (RLS), or extended Kalman filtering (EKF), based algorithms have been demonstrated to be a class of effective online training methods for neural networks. This paper discusses several aspects of pruning a neural network trained by the RLS-based approach. Based on our study, the RLS approach is implicitly a weight decay training algorithm. Also, we derive two prunin...
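For reference, plain weight decay in its usual gradient-descent form augments the error gradient with a term proportional to the weights themselves (the RLS derivation referred to above is not reproduced here):

$$ w_{t+1} = w_t - \eta \left( \nabla E(w_t) + \lambda\, w_t \right), $$

where $\eta$ is the learning rate and $\lambda$ the decay coefficient; weights that contribute little to reducing $E$ are driven toward zero, which is what makes them candidates for pruning.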


Pruning recurrent neural networks for improved generalization performance

Determining the architecture of a neural network is an important issue for any learning task. For recurrent neural networks, no general methods exist for estimating the number of hidden layers, the size of the layers, or the number of weights. We present a simple pruning heuristic that significantly improves the generalization performance of trained recurrent networks. We il...


Optimal Feed-Forward Neural Networks Based on the Combination of Constructing and Pruning by Genetic Algorithms

The determination of the proper size of an artificial neural network (ANN) is recognized to be crucial, especially for its practical implementation in important issues such as learning and generalization. In this paper, an effective method for designing neural network architectures is presented. The network is first trained by a dynamic constructive method until the error criterion is satisfied. The tra...
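The grow-until-satisfied phase described above can be sketched as a simple loop that adds hidden neurons and retrains until the training error falls below a tolerance; the subsequent pruning phase is not reproduced here. MLPRegressor, the MSE criterion, and the tolerance value are stand-ins for illustration, not the paper's trainer.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def grow_until_satisfied(X, y, tol=1e-3, max_hidden=50):
    """Add hidden neurons one at a time until the training MSE drops below tol."""
    net = None
    for h in range(1, max_hidden + 1):
        net = MLPRegressor(hidden_layer_sizes=(h,), max_iter=2000,
                           random_state=0).fit(X, y)
        if np.mean((net.predict(X) - y) ** 2) <= tol:
            break                              # error criterion satisfied
    return net                                 # largest network if never satisfied
```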



Journal:
  • Journal of Intelligent and Fuzzy Systems

Volume: 13    Issue:

Pages: -

Publication year: 2002